Generalized Relative Information and Information Inequalities

Author

  • INDER JEET TANEJA
Abstract

In this paper, we obtain bounds on Csiszár's f-divergence in terms of the relative information of type s, using Dragomir's [9] approach. In particular, the results lead to bounds in terms of the χ-divergence, Kullback-Leibler relative information, and Hellinger's discrimination.
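As context for the divergences named above, Csiszár's f-divergence and its familiar special cases can be sketched numerically. This is a minimal illustration using commonly cited generating functions; the paper's own notation (and its definition of the type-s relative information) may differ, and the distributions `p`, `q` are made up for the example:

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence: C_f(P, Q) = sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Generating functions under common conventions (an assumption,
# not necessarily the paper's exact normalization):
kl = lambda t: t * math.log(t)                     # Kullback-Leibler
chi2 = lambda t: (t - 1) ** 2                      # chi-square divergence
hellinger = lambda t: (math.sqrt(t) - 1) ** 2 / 2  # Hellinger discrimination

def type_s(s):
    """Relative information of type s; at s = 2 this reduces to chi2 / 2."""
    return lambda t: (t ** s - 1 - s * (t - 1)) / (s * (s - 1))

p = [0.2, 0.5, 0.3]
q = [0.25, 0.25, 0.5]

print(f_divergence(p, q, kl))
print(f_divergence(p, q, chi2))        # equals sum (p_i - q_i)^2 / q_i = 0.34
print(f_divergence(p, q, hellinger))
print(f_divergence(p, q, type_s(2)))   # half the chi-square value
```

Each choice of convex f with f(1) = 0 yields a non-negative divergence that vanishes when P = Q, which is what makes uniform bounds across this family (the subject of the paper) possible.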


Similar articles

Displacement convexity of generalized relative entropies. II

We introduce a class of generalized relative entropies (inspired by the Bregman divergence in information theory) on the Wasserstein space over a weighted Riemannian or Finsler manifold. We prove that the convexity of all the entropies in this class is equivalent to the combination of the non-negative weighted Ricci curvature and the convexity of another weight function used in the definition o...


Generalized Interlacing Inequalities

We discuss some applications of generalized interlacing inequalities of Ky Fan to the study of (a) some classical matrix inequalities and (b) matrix problems in quantum information science. AMS Classification 15A18, 15A57, 15A60, 15A90.


On Hadamard and Fejér-Hadamard inequalities for Caputo k-fractional derivatives

In this paper we prove certain Hadamard and Fejér-Hadamard inequalities for functions whose nth derivatives are convex, by using Caputo k-fractional derivatives. These results are related to inequalities for Caputo fractional derivatives.


New bounds for the generalized Marcum Q-function

In this paper, we study the generalized Marcum Q-function. Our aim is to extend the results of Corazza and Ferrari (IEEE Trans. Inf. Theory, vol. 48, pp. 3003–3008, 2002) to the generalized Marcum Q-function in order to deduce some new tight lower and upper bounds. The key tools in our proofs are some monotonicity properties of certain functions involving the modified Bessel function o...

متن کامل

Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...



Journal:

Volume   Issue

Pages  -

Publication date: 2003